
    Plasma equilibrium reconstruction of jet discharges using the imas modelling infrastructure

    The reconstruction of Tokamak plasma equilibrium is a fundamental step in the understanding of fusion plasma physics, since it sets the starting point for all subsequent plasma modelling applications and experimental data interpretation. The verification and validation of the numerical codes used to reconstruct plasma equilibrium, using as much of the available input experimental data as possible (e.g. magnetic field or flux measurements, density and temperature diagnostics, and polarimetry diagnostics), is essential both for physics model interpretation and when qualifying and extrapolating for ITER. In the framework of the EUROfusion Work Package on Code Development for Integrated Modelling, a scientific Kepler workflow for the reconstruction of Tokamak plasma equilibrium was prototyped, using the ITER Integrated Modelling and Analysis Suite (IMAS). The workflow can seamlessly use any sort of data from Tokamak experiments and call equilibrium reconstruction codes such as EQUAL, EQUINOX, NICE, EFIT++ and SDSS, all using the same physics and engineering data ontology and methods for accessing the data. In the paper, plasma equilibrium reconstructions on dedicated JET plasma discharges are shown, at first using magnetic data only and subsequently also considering other constraints such as Motional Stark Effect (MSE) measurements. Results with magnetics only give good qualitative and quantitative agreement between the codes, while including MSE, as anticipated, a substantial improvement of the core plasma profiles is achieved.
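    The effect of adding internal constraints such as MSE to a magnetics-only fit can be illustrated with a toy least-squares problem. The sketch below is purely illustrative and does not correspond to the EQUAL/EQUINOX/NICE/EFIT++/SDSS codes: the matrices G_mag and G_mse are invented stand-ins for measurement response operators, with the magnetics-only operator made deliberately insensitive to one "core" parameter.

    ```python
    import numpy as np

    # Hypothetical sketch: reconstruction posed as a linear least-squares
    # fit of parameters p to measurements d, where d = G @ p + noise.
    # All names and matrices here are illustrative assumptions.
    rng = np.random.default_rng(0)
    n_params = 5
    p_true = rng.normal(size=n_params)          # "true" profile parameters

    # Magnetics-only response: edge measurements, nearly blind to the
    # last ("core") parameter, so that component is poorly constrained.
    G_mag = rng.normal(size=(20, n_params))
    G_mag[:, -1] *= 1e-3
    d_mag = G_mag @ p_true + 0.05 * rng.normal(size=20)

    # MSE adds internal measurements with full sensitivity to the core.
    G_mse = rng.normal(size=(10, n_params))
    d_mse = G_mse @ p_true + 0.05 * rng.normal(size=10)

    def fit(G, d):
        """Ordinary least-squares estimate of the parameters."""
        p, *_ = np.linalg.lstsq(G, d, rcond=None)
        return p

    p_mag = fit(G_mag, d_mag)                           # magnetics only
    p_all = fit(np.vstack([G_mag, G_mse]),              # magnetics + MSE
                np.concatenate([d_mag, d_mse]))

    err_mag = np.linalg.norm(p_mag - p_true)
    err_all = np.linalg.norm(p_all - p_true)
    print(f"parameter error, magnetics only: {err_mag:.3f}")
    print(f"parameter error, with MSE:       {err_all:.3f}")
    ```

    In this toy setup the combined fit recovers the weakly observed core parameter far more accurately, mirroring the qualitative conclusion of the paper that MSE constraints mainly improve the core profiles.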

    Recent EUROfusion Achievements in Support of Computationally Demanding Multiscale Fusion Physics Simulations and Integrated Modeling

    Integrated modeling (IM) of present experiments and future tokamak reactors requires the provision of computational resources and numerical tools capable of simulating multiscale spatial phenomena, as well as fast transient events and relatively slow plasma evolution, within a reasonably short computational time. Recent progress achieved within the EUROfusion Consortium is presented in the implementation of new computational resources for fusion applications in Europe based on modern supercomputer technologies (the MARCONI-FUSION supercomputer), in the optimization and speedup of the EU fusion-related first-principles codes, and in the development of a basis for integrating physics codes/modules into a centrally maintained suite of IM tools. Physics phenomena that can now be reasonably modelled in various areas (core turbulence and magnetic reconnection, edge and scrape-off layer physics, radio-frequency heating and current drive, magnetohydrodynamic modelling, reflectometry simulations) following successful code optimization and parallelization are briefly described. Development activities in support of IM are summarized. They include support to (1) the local deployment of the IM infrastructure and access to experimental data at various host sites, (2) the management of releases for sophisticated IM workflows involving a large number of components, and (3) the performance optimization of complex IM workflows. This work has been carried out within the framework of the EUROfusion Consortium and has received funding from the Euratom research and training programme 2014 to 2018 under grant agreement 633053. The views and opinions expressed herein do not necessarily reflect those of the European Commission or ITER.

    FAIR Data Pipeline: provenance-driven data management for traceable scientific workflows

    Modern epidemiological analyses to understand and combat the spread of disease depend critically on access to, and use of, data. Rapidly evolving data, such as data streams changing during a disease outbreak, are particularly challenging. Data management is further complicated by data being imprecisely identified when used. Public trust in policy decisions resulting from such analyses is easily damaged and is often low, with cynicism arising where claims of "following the science" are made without accompanying evidence. Tracing the provenance of such decisions back through open software to primary data would clarify this evidence, enhancing the transparency of the decision-making process. Here, we demonstrate a Findable, Accessible, Interoperable and Reusable (FAIR) data pipeline, developed during the COVID-19 pandemic, that allows easy annotation of data as they are consumed by analyses, while tracing the provenance of scientific outputs back through the analytical source code to data sources. Such a tool provides a mechanism for the public, and fellow scientists, to better assess the trust that should be placed in scientific evidence, while allowing scientists to support policy-makers in openly justifying their decisions. We believe that tools such as this should be promoted for use across all areas of policy-facing research.
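    The core idea, annotating data as they are consumed and tying outputs back to exact input versions, can be sketched with a minimal provenance log. This is an illustrative assumption, not the actual FAIR Data Pipeline API; all class and field names below are invented for the example.

    ```python
    import hashlib
    import json
    from datetime import datetime, timezone

    class ProvenanceLog:
        """Hypothetical sketch: records which inputs an analysis consumed
        and what it produced, so an output can be traced back to the exact
        data it was derived from (here via content hashes)."""

        def __init__(self, script_name):
            self.record = {
                "script": script_name,
                "run_at": datetime.now(timezone.utc).isoformat(),
                "inputs": [],
                "outputs": [],
            }

        @staticmethod
        def _digest(data: bytes) -> str:
            return hashlib.sha256(data).hexdigest()

        def read_input(self, name: str, data: bytes) -> bytes:
            # Annotate the data as it is consumed: name plus content hash.
            self.record["inputs"].append({"name": name, "sha256": self._digest(data)})
            return data

        def write_output(self, name: str, data: bytes) -> None:
            self.record["outputs"].append({"name": name, "sha256": self._digest(data)})

        def export(self) -> str:
            return json.dumps(self.record, indent=2)

    # Usage: an analysis reads a case-count table and writes a summary;
    # the exported record links the summary to the exact input bytes.
    log = ProvenanceLog("estimate_cases.py")
    raw = log.read_input("cases/2020-03-01.csv", b"region,cases\nA,10\nB,3\n")
    total = sum(int(row.split(",")[1]) for row in raw.decode().splitlines()[1:])
    log.write_output("summary/total_cases.txt", str(total).encode())
    ```

    A real implementation would also register the analysis code version and store the records in a queryable registry, but the hash-based linkage shown here is the mechanism that makes the resulting evidence chain auditable.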

    Digital twin applications for the JET divertor

    Digital twin techniques enhance traditional engineering analysis workflows for existing systems when a realistic evaluation of a component under complex operating conditions is required. During the preparation, commissioning and operating phases, components can be virtually tested using validated numerical models, operational expertise, and experimental databases. Three complementary applications have been developed under this approach. The numerical models used for the divertor tiles are based on continuum mechanics formulations. Their loading conditions are defined using the current physics and engineering understanding of a combination of experimental measurements. The aim of these tools is to increase the operational range, reliability, and predictability of the JET divertor.